VMM + WTA Embedded Classifiers Learning Algorithm implementable on SoC FPAA devices

Authors

  • Jennifer Hasler
  • Sahil Shah
Abstract

This paper presents a learning algorithm for a one-layer VMM + WTA classifier architecture on a large-scale Field Programmable Analog Array (FPAA). The technique opens opportunities for embedded, ultra-low-power machine learning, a capability typically considered only for large servers. To develop this training algorithm, the paper starts by understanding fundamental equivalent transformations for VMM + WTA classifier networks. A VMM + WTA structure can exactly compute a Self-Organizing Map (SOM) or Vector Quantization (VQ) operation, in addition to other transformations. Learning concepts from SOM, VQ, and Gaussian Mixture Models (GMM) are utilized in the training algorithm for this single one-layer network. An on-chip clustering step determines the initial weight set for ideal target and background values. Null symbols are important for the algorithm and are set from midpoints of the target values. The results are shown both as numerical simulations of the VMM + WTA learning network, illustrating some numerical ODE simulation limitations for this problem, and as experimental measurements on an SoC FPAA device.

This paper focuses on training classifiers built from a single layer of a Vector-Matrix Multiplier (VMM) and a single layer of a k-Winner-Take-All (WTA) [1], building on our original foundational work experimentally demonstrating VMM + WTA classifiers as universal approximators [2], and on recent work demonstrating a single engineer-tuned example of word-spotting classification on the SoC large-scale Field Programmable Analog Array (FPAA) [3]. The experimental demonstration of the universal approximation concept [2], based on Maass' theoretical description a decade earlier [4], encourages the use of one-layer (or multiple-layer) networks for embedded low-power classification. A universal approximator is a classifier that can represent any static function (with infinite resources) in a single layer of processing. An automatic technique for learning the VMM classifier weights from representative data sets is far preferable to hand-tuning a particular classifier for a given application. This work develops a VMM + WTA training algorithm capable of the universal-approximator functionality, demonstrating the learning both in numerical simulation and experimentally on an SoC FPAA. The algorithm was developed through understanding the connections of the VMM + WTA classifier to Self-Organizing Maps (SOM) [5], [6], [7], Vector Quantization (VQ) and Learning VQ [8], [9], [10], [11], [12], [13], Gaussian Mixture Models (GMM) [14], and Support Vector Machine (SVM) capabilities [15]. Training multilayer networks, typically required for universal approximation, often has training issues due to error estimation in all but the last classifier layer; avoiding this issue is central to most neural network approaches.

The authors are with the School of Electrical and Computer Engineering (ECE), Georgia Institute of Technology, Atlanta, GA 30332-250 USA (email: [email protected]).

[Figure: SoC FPAA block diagram — FPAA fabric with analog signal processing (ASP) and classifier (VMM + WTA), 16-bit processor, and input.]
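The feedforward structure and the initialization described in the abstract can be sketched in a few lines of Python. This is a minimal numerical sketch, not the paper's on-chip implementation: the function names (`vmm_wta`, `init_weights`, `lvq_step`), the use of per-class centroids as the clustering step, the placement of null rows at midpoints between target centroids, and the LVQ1-style update are assumptions for illustration.

```python
import numpy as np

def vmm_wta(x, W, k=1):
    """Feedforward pass: a vector-matrix multiply (VMM) followed by a
    k-winner-take-all (WTA) over the node outputs.
    x: (n_in,) input vector; W: (n_nodes, n_in) weight matrix."""
    y = W @ x                      # one output per classifier node
    return np.argsort(y)[-k:]      # indices of the k largest responses

def init_weights(class_examples):
    """Clustering-style initialization: each target row is the centroid of one
    class's training examples; null rows are placed at midpoints between pairs
    of target centroids (one reading of 'null symbols from midpoints')."""
    targets = np.array([np.mean(ex, axis=0) for ex in class_examples])
    nulls = [0.5 * (targets[i] + targets[j])
             for i in range(len(targets)) for j in range(i + 1, len(targets))]
    return np.vstack([targets] + nulls) if nulls else targets

def lvq_step(x, label, W, row_labels, lr=0.05):
    """One LVQ1-style update (illustrative, not the paper's exact rule): move
    the winning row toward x if its label matches, away from x otherwise."""
    win = int(np.argmax(W @ x))
    sign = 1.0 if row_labels[win] == label else -1.0
    W[win] += sign * lr * (x - W[win])
    return W
```

For example, with two target classes, `init_weights` returns three rows (two target centroids plus one null row at their midpoint), and `vmm_wta` with `k=1` selects the single best-matching row; the null row then acts as a rejection or background region between the target classes.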


Similar Articles

SoC FPAA Hardware Implementation of a VMM+WTA Embedded Learning Classifier

This paper focuses on the circuit aspects required for on-chip, on-line learning of a Vector-Matrix Multiplier (VMM) + Winner-Take-All (WTA) classifier structure on an SoC large-scale Field Programmable Analog Array (FPAA). We start by describing the VMM + WTA classifier structure, and then show the techniques required to handle device mismatch. The approach is initially explained using a VMM + WTA as a tw...
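The excerpt mentions handling device mismatch; one simple software analogue (an assumption for illustration, not the circuit technique used in the paper) is to measure each node's deviation from an ideal response with a known calibration input and subtract that offset before the WTA decision. The `read_outputs` hook and function names below are hypothetical.

```python
import numpy as np

def estimate_offsets(read_outputs, x_cal, y_ideal):
    """Hypothetical calibration step: read_outputs(x) returns measured per-node
    VMM outputs; the difference from the ideal response to a known calibration
    input x_cal is treated as a static per-node mismatch offset."""
    return read_outputs(x_cal) - y_ideal

def wta_with_offset_correction(read_outputs, x, offsets):
    y = read_outputs(x) - offsets     # remove per-node mismatch before comparing
    return int(np.argmax(y))          # winner-take-all on corrected outputs
```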


Learning for VMM + WTA Embedded Classifiers

The authors present training and feedforward computation for a single layer of a VMM + WTA classifier. The experimental demonstration of the one-layer universal approximator encourages the use of one-layer networks for embedded low-power classification. The results enable correct classification of each novel acoustic signal (generator, idle car, and idle truck). The classification structure req...


Embedded Memory Test Strategies and Repair

The demand for self-testing increases proportionally with memory size in a System on Chip (SoC). Memories normally occupy the majority of an SoC's area. Due to the increasing density of embedded memories, there is a need for a self-testing mechanism in SoC design. Therefore, this research study focuses on this problem and introduces a streamlined solution for self-testing. In the proposed m...


Classifying with Gaussian Mixtures and Clusters

In this paper, we derive classifiers which are winner-take-all (WTA) approximations to a Bayes classifier with Gaussian mixtures for class-conditional densities. The derived classifiers include clustering-based algorithms like LVQ and k-Means. We propose a constrained-rank Gaussian mixture model and derive a WTA algorithm for it. Our experiments with two speech classification tasks indicate th...
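To make the reduction from a Gaussian-mixture Bayes rule to a WTA concrete, the sketch below assumes one spherical Gaussian per class with a shared variance (a simplifying assumption; the function name and parameters are illustrative, not taken from the paper). Under that assumption the max-posterior decision collapses to a linear score per class, i.e. a VMM followed by a winner-take-all.

```python
import numpy as np

def gaussian_wta_classify(x, means, priors, var=1.0):
    """WTA approximation of a Bayes rule with shared-variance spherical Gaussians:
    dropping the terms that do not depend on the class, the max-posterior choice
    becomes argmax of a linear score w.x + b, i.e. a VMM followed by a WTA."""
    W = means / var                                              # per-class linear weights
    b = np.log(priors) - 0.5 * np.sum(means**2, axis=1) / var    # per-class biases
    scores = W @ x + b
    return int(np.argmax(scores))                                # winner-take-all decision
```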


Application of ensemble learning techniques to model the atmospheric concentration of SO2

In view of pollution-prediction modeling, the study adopts homogeneous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression, and re...
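Below is a minimal scikit-learn sketch of the homogeneous/heterogeneous ensemble comparison described above, using synthetic placeholder data; the study's actual SO2 features, the additive-regression learner, and any tuning are not reproduced here.

```python
# Compare individual base learners against a heterogeneous voting ensemble
# on synthetic placeholder data (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor, BaggingRegressor, VotingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                                   # placeholder features
y = X @ rng.normal(size=5) + rng.normal(scale=0.1, size=200)    # placeholder SO2 levels

models = {
    "random_forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "bagging": BaggingRegressor(random_state=0),
    "svm": SVR(),
    "linear": LinearRegression(),
}
# Heterogeneous (voting) ensemble built from the individual base learners
models["voting"] = VotingRegressor([(name, est) for name, est in models.items()])

for name, model in models.items():
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean R^2 = {score:.3f}")
```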



Journal:

Volume   Issue

Pages  -

Publication date: 2018